
Conversation

@waynebruce0x (Collaborator)

No description provided.

@waynebruce0x marked this pull request as draft on October 7, 2025 15:04
@g1nt0ki (Member) left a comment

.

Member

hmm, I don't like the idea of using R2 strings as a DB. Can you replace it with a dedicated index (a table in ES) in our coins ES DB? Are we storing the token mcaps in the coins-current index now?
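For reference, a minimal sketch of the dedicated-index idea using the official v8 @elastic/elasticsearch client; the index name, document shape, and connection setup are assumptions, not the repo's existing wiring:

```ts
import { Client } from "@elastic/elasticsearch";

// Assumed connection setup; the repo's own elastic.getClient() helper would replace this.
const client = new Client({ node: process.env.COINS_ES_NODE ?? "http://localhost:9200" });

const DISTRESSED_INDEX = "distressed-assets"; // assumed index name

// One document per coin key, keyed by PK, so re-adding the same coin is idempotent.
export async function markDistressed(pk: string): Promise<void> {
  await client.index({
    index: DISTRESSED_INDEX,
    id: pk,
    document: { pk, addedAt: Date.now() },
  });
}

// Read the full list back out, instead of parsing a JSON blob from R2.
export async function listDistressed(): Promise<string[]> {
  const res = await client.search<{ pk: string }>({
    index: DISTRESSED_INDEX,
    size: 10_000,
    query: { match_all: {} },
  });
  return res.hits.hits.map((hit) => hit._source!.pk);
}
```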


export default async function getTokenPrices(timestamp: number) {
  const writes: Write[] = [];
  const assets = await getR2JSONString(r2Key);
Member

so, does this either get manually triggered, or do we store the same distressed list to DDB every hour?

Collaborator (Author)

Currently we store every hour, but we could change this by:

  • writing new distressed assets to SK=0 so we can fetch metadata
  • adding a case to the coins endpoints where latency checks are skipped when isDistressed() returns true
  • adding cases to deficoins, bridged, and coingecko where writes are dropped when isDistressed() returns true (see the sketch below)
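A minimal sketch of that last bullet, dropping writes for distressed coins; isDistressed and Write come from this PR, while the helper name and call site are illustrative:

```ts
// Hypothetical helper: drop any pending writes whose key is flagged as distressed.
async function filterDistressedWrites(writes: Write[]): Promise<Write[]> {
  const flags = await Promise.all(writes.map((w) => isDistressed(w.PK)));
  return writes.filter((_, i) => !flags[i]);
}

// e.g. in the deficoins/bridged/coingecko adapters, just before storing:
// writes = await filterDistressedWrites(writes);
```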

function checkMcaps(address: string, mcapData: any, usdAmount: number, promises: Promise<void>[]) {
  const mcap = mcapData[address];
  if (mcap && usdAmount > mcap) {
    promises.push(addToDistressedList(address));
Member

ah, is this the only place where we add a token to the potential distressed list? So the R2 object is more of a placeholder?

Collaborator (Author)

We would also add tokens to the list through the UI tool.

});

appendToStaleCoins(usdTvl, staleCoinsInclusive, staleCoins);
await Promise.all(distressedCoinsPromises);
Member

I would design it slightly differently. One solution: we send logs to ES (not the coins cluster, but the one we use for logging/linked to our SDK), one per token & project; then, hourly, we pull these logs and show a notification. You can check the implementation here:

logging: https://github.com/DefiLlama/defillama-server/blob/master/defi/src/adaptors/handlers/storeAdaptorData/index.ts#L353
usage: https://github.com/DefiLlama/defillama-server/blob/master/defi/src/notifyOutdated.ts#L106

The other option is gathering all the distressed coins and writing to R2 only once.

Either way, I would avoid Promise.all when we don't know how many tokens will be added to the list. Another issue with the current approach is that it is ripe for losing tokens/overwriting because of race conditions.
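A minimal sketch of the second option, collecting distressed coins in memory and flushing them to R2 once per run; it reuses the getR2JSONString/storeR2JSONString helpers and r2Key from this PR and assumes the stored value is a plain array of keys:

```ts
// Collected during the run instead of firing one write (and one promise) per token.
const pendingDistressed = new Set<string>();

export function queueDistressed(address: string): void {
  pendingDistressed.add(address);
}

// Called once at the end of the run: a single read-merge-write, no unbounded Promise.all.
export async function flushDistressed(): Promise<void> {
  if (pendingDistressed.size === 0) return;
  const existing: string[] = (await getR2JSONString(r2Key)) ?? []; // assumed shape: string[]
  const merged = Array.from(new Set([...existing, ...pendingDistressed]));
  await storeR2JSONString(r2Key, JSON.stringify(merged));
  pendingDistressed.clear();
}
```

This still leaves a read-merge-write race if two jobs flush at the same time, which is another argument for the log/ES option above.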

@waynebruce0x marked this pull request as ready for review on October 9, 2025 14:50
@waynebruce0x marked this pull request as draft on October 9, 2025 14:50
});

return false;
if (isLocalClient) await client?.close();
Member

can leave the connection open and close it once when the process exits. Noticed this pattern in other places as well; not sure it's a good idea to open & close a connection per request. Most of our scripts run under an hour, so I don't see why we need to close it per write.
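A minimal sketch of the process-lifetime pattern being suggested; elastic.getClient() is the wrapper already used in this PR, while the shared-client helper and exit hook are illustrative:

```ts
// Assumes the repo's elastic wrapper module is imported in this file.
// Lazily create one client for the whole process and reuse it for every write.
let sharedClient: ReturnType<typeof elastic.getClient> | undefined;

export function getSharedClient() {
  if (!sharedClient) sharedClient = elastic.getClient();
  return sharedClient;
}

// Close once, when the script finishes, instead of per request/write.
process.once("beforeExit", () => {
  void sharedClient?.close();
});
```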

export async function readDistressedLogs() {
  const esClient = elastic.getClient();
  const hourAgo = Math.floor(Date.now() / 1000) - 3600;
  let { lastCheckTS } = (await cache.readExpiringJsonCache("distressed-assets-last-check")) || { lastCheckTS: 0 };
Member

can we use elastic.getAllLogs() here and skip the cache?

);

await storeR2JSONString(r2Key, JSON.stringify(data));
await esClient?.close();
Member

no need to close this or use the expiring cache; we can use the ES index as the single source of truth

@g1nt0ki marked this pull request as ready for review on October 22, 2025 13:02
import produceKafkaTopics from "../../utils/coins3/produce";
import { chainsThatShouldNotBeLowerCased } from "../../utils/shared/constants";
import { sendMessage } from "../../../../defi/src/utils/discord";
import { isDistressed } from "../../utils/shared/distressedCoins";
Member

I dislike the logic duplication here.

Can you consolidate all the writes into a single function, writeToDB()?

  • this would do all the validation and filtering
  • then write to both DDB & Kafka

This way, if I want to write something from the ui-tool or anywhere else, I know what to call: a single function that takes care of validation and storage. And if, in the future, we want to migrate to some other DB, we only update a single function.
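A minimal sketch of that consolidation; produceKafkaTopics is imported in this diff, while batchWrite, the Write shape, and the validation rules are assumptions about the repo's helpers rather than the final API:

```ts
// Single entry point: validate and filter once, then store to both DDB and Kafka.
export async function writeToDB(writes: Write[]): Promise<void> {
  // 1. validation and filtering in one place (placeholder rules)
  const filtered = writes.filter(
    (w) => !!w.PK && (w.price === undefined || (Number.isFinite(w.price) && w.price >= 0)),
  );
  if (filtered.length === 0) return;

  // 2. one storage path for every caller (tvl scripts, ui-tool, future migrations)
  await batchWrite(filtered, true);    // assumed DDB batch-write helper
  await produceKafkaTopics(filtered);  // Kafka producer imported in this diff (signature assumed)
}
```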


const mcapCache: { [PK: string]: any } = {}

async function getMcapData(readKeys: string[], timestamp: string | number): Promise<any> {
Member

looks good; can you move this and the price function to coinsApi.ts in the same folder? These functions can be used elsewhere as well and have nothing to do with TVL.

Member

hmm, on second thought, if you use the latest SDK version, doesn't it already have functions that return cached price & mcap info?

}

// store distressed metadata to DDB SK0 for metadata retrieval
export async function storeDistressedCoins(keys: string[], coinsESClient?: any) {
Member

let's add a new field, isDistressed, to the price object, so that for the coins we want to mark as distressed we just set that flag to true.

So here, for a given key list, we pull the current metadata objects (SK=0 on DDB), set the flag, and call writeToDB, which will take care of storing.
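A minimal sketch of that flow; getTokenMetadata is a hypothetical SK=0 reader and writeToDB is the consolidated writer suggested above:

```ts
// Flag existing metadata records as distressed instead of keeping a separate list.
export async function storeDistressedCoins(keys: string[]): Promise<void> {
  // Pull the current SK=0 metadata objects for the given keys (hypothetical reader).
  const metadata = await Promise.all(keys.map((PK) => getTokenMetadata(PK)));

  const writes = metadata
    .filter((record) => record != null)
    .map((record) => ({ ...record, isDistressed: true }));

  // One call takes care of validation and storage to DDB/Kafka.
  await writeToDB(writes);
}
```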

}

// get list of possible distressed coins from ES logs in the last week
export async function readDistressedLogs() {
Member

looks good

}

// write sorted logs, and manual entries, to the assets store
export async function addToDistressed(keys: string[]) {
Member

looks good

}

// batch check if a list of coins are distressed
export async function batchIsDistressed(keys: string[]) {
Member

can you add the isDistressed field to the price API response? I will update the SDK to include it in the getPrices response.

Then, for the batch call here, we check whether the metadata response has isDistressed set to true.
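A minimal sketch of that batch check; getMetadataRecords is a hypothetical batch reader over the SK=0/price-API records that would carry the new isDistressed field:

```ts
// A coin is considered distressed iff its metadata record has isDistressed === true.
export async function batchIsDistressed(keys: string[]): Promise<Set<string>> {
  // Hypothetical: returns { [PK]: { isDistressed?: boolean, ... } } for the requested keys.
  const records = await getMetadataRecords(keys);

  const distressed = new Set<string>();
  for (const key of keys) {
    if (records[key]?.isDistressed === true) distressed.add(key);
  }
  return distressed;
}
```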

@g1nt0ki (Member) left a comment

.
